[ Wed Sep 28 02:20:20 2022 ] using warm up, epoch: 5
[ Wed Sep 28 02:21:34 2022 ] Parameters:
{'work_dir': 'work_dir/ntu60/cview/fc_joint',
 'model_saved_name': 'work_dir/ntu60/cview/fc_joint/runs',
 'config': 'config/nturgbd-cross-view/fc.yaml',
 'phase': 'train', 'save_score': False, 'joint_label': [], 'seed': 1,
 'log_interval': 100, 'save_interval': 1, 'save_epoch': 35, 'eval_interval': 5,
 'ema': False, 'print_log': True, 'show_topk': [1, 5],
 'feeder': 'feeders.feeder_ntu.Feeder', 'num_worker': 48,
 'train_feeder_args': {'data_path': 'data/ntu60/NTU60_CV.npz', 'split': 'train', 'debug': False, 'random_choose': False, 'random_shift': False, 'random_move': False, 'window_size': 64, 'normalization': False, 'random_rot': True, 'p_interval': [0.5, 1], 'vel': False, 'bone': False},
 'test_feeder_args': {'data_path': 'data/ntu60/NTU60_CV.npz', 'split': 'test', 'window_size': 64, 'p_interval': [0.95], 'vel': False, 'bone': False, 'debug': False},
 'model': 'model.FC-Chains_L_multi_head_new_12_layers.Model',
 'model_args': {'num_class': 60, 'num_point': 25, 'num_person': 2},
 'weights': None, 'ignore_weights': [], 'base_lr': 0.1, 'step': [90, 100],
 'device': [4], 'optimizer': 'SGD', 'nesterov': True, 'momentum': 0.9,
 'batch_size': 64, 'test_batch_size': 64, 'start_epoch': 0, 'num_epoch': 110,
 'weight_decay': 0.0004, 'lr_decay_rate': 0.1, 'warm_up_epoch': 5}
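The schedule implied by these parameters (base_lr 0.1, warm_up_epoch 5, step [90, 100], lr_decay_rate 0.1) can be sketched as below. This is a minimal reconstruction, not the training code itself: the linear form of the warmup is an assumption, though it matches the common practice in this family of skeleton-recognition codebases and the "using warm up, epoch: 5" message above.

```python
def get_lr(epoch, base_lr=0.1, warm_up_epoch=5, step=(90, 100), lr_decay_rate=0.1):
    """Learning rate for a given (0-indexed) epoch, per the logged config.

    Assumed behavior: linear warmup over the first `warm_up_epoch` epochs,
    then multiply by `lr_decay_rate` at each milestone in `step`.
    """
    if epoch < warm_up_epoch:
        # Ramp linearly from base_lr / warm_up_epoch up to base_lr.
        return base_lr * (epoch + 1) / warm_up_epoch
    # Apply one decay factor per milestone already passed.
    decay = lr_decay_rate ** sum(epoch >= s for s in step)
    return base_lr * decay
```

Under this sketch the run holds lr = 0.1 after warmup, drops to 0.01 at epoch 90 and to 0.001 at epoch 100, over num_epoch = 110 total epochs.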

[ Wed Sep 28 02:21:34 2022 ] # Parameters: 2082097
[ Wed Sep 28 02:21:34 2022 ] Training epoch: 1
[ Wed Sep 28 02:24:34 2022 ] 	Mean training loss: 2.6876. loss2: 0.0000. Mean training acc: 28.96%.
[ Wed Sep 28 02:24:34 2022 ] 	Time consumption: [Data]02%, [Network]98%
[ Wed Sep 28 02:24:34 2022 ] Eval epoch: 1
[ Wed Sep 28 02:25:08 2022 ] 	Mean test loss of 296 batches: 1.7049396996562545.
[ Wed Sep 28 02:25:08 2022 ] 	Top1: 47.75%
[ Wed Sep 28 02:25:08 2022 ] 	Top5: 87.06%
[ Wed Sep 28 02:25:08 2022 ] Training epoch: 2
[ Wed Sep 28 02:28:06 2022 ] 	Mean training loss: 1.6602. loss2: 0.0000. Mean training acc: 50.82%.
[ Wed Sep 28 02:28:06 2022 ] 	Time consumption: [Data]02%, [Network]98%
[ Wed Sep 28 02:28:06 2022 ] Eval epoch: 2
[ Wed Sep 28 02:28:40 2022 ] 	Mean test loss of 296 batches: 1.3009400365723145.
[ Wed Sep 28 02:28:40 2022 ] 	Top1: 60.15%
[ Wed Sep 28 02:28:40 2022 ] 	Top5: 90.84%
[ Wed Sep 28 02:28:40 2022 ] Training epoch: 3
[ Wed Sep 28 02:31:38 2022 ] 	Mean training loss: 1.3189. loss2: 0.0000. Mean training acc: 60.56%.
[ Wed Sep 28 02:31:38 2022 ] 	Time consumption: [Data]02%, [Network]98%
[ Wed Sep 28 02:31:38 2022 ] Eval epoch: 3
[ Wed Sep 28 02:32:11 2022 ] 	Mean test loss of 296 batches: 0.9370890358412588.
[ Wed Sep 28 02:32:11 2022 ] 	Top1: 70.42%
[ Wed Sep 28 02:32:11 2022 ] 	Top5: 95.48%
[ Wed Sep 28 02:32:11 2022 ] Training epoch: 4
[ Wed Sep 28 02:35:09 2022 ] 	Mean training loss: 1.1476. loss2: 0.0000. Mean training acc: 65.20%.
[ Wed Sep 28 02:35:09 2022 ] 	Time consumption: [Data]02%, [Network]98%
[ Wed Sep 28 02:35:09 2022 ] Eval epoch: 4
[ Wed Sep 28 02:35:42 2022 ] 	Mean test loss of 296 batches: 0.8705730734241975.
[ Wed Sep 28 02:35:42 2022 ] 	Top1: 72.62%
[ Wed Sep 28 02:35:43 2022 ] 	Top5: 95.73%
[ Wed Sep 28 02:35:43 2022 ] Training epoch: 5
[ Wed Sep 28 02:38:40 2022 ] 	Mean training loss: 1.0430. loss2: 0.0000. Mean training acc: 68.01%.
[ Wed Sep 28 02:38:40 2022 ] 	Time consumption: [Data]02%, [Network]98%
[ Wed Sep 28 02:38:40 2022 ] Eval epoch: 5
[ Wed Sep 28 02:39:14 2022 ] 	Mean test loss of 296 batches: 0.8477446117715256.
[ Wed Sep 28 02:39:14 2022 ] 	Top1: 72.36%
[ Wed Sep 28 02:39:14 2022 ] 	Top5: 95.84%
[ Wed Sep 28 02:39:14 2022 ] Training epoch: 6
[ Wed Sep 28 02:42:12 2022 ] 	Mean training loss: 0.9167. loss2: 0.0000. Mean training acc: 71.45%.
[ Wed Sep 28 02:42:12 2022 ] 	Time consumption: [Data]02%, [Network]98%
[ Wed Sep 28 02:42:12 2022 ] Eval epoch: 6
[ Wed Sep 28 02:42:46 2022 ] 	Mean test loss of 296 batches: 0.6688667355558356.
[ Wed Sep 28 02:42:46 2022 ] 	Top1: 79.43%
[ Wed Sep 28 02:42:46 2022 ] 	Top5: 96.84%
[ Wed Sep 28 02:42:46 2022 ] Training epoch: 7
[ Wed Sep 28 02:45:44 2022 ] 	Mean training loss: 0.8453. loss2: 0.0000. Mean training acc: 73.51%.
[ Wed Sep 28 02:45:44 2022 ] 	Time consumption: [Data]02%, [Network]98%
[ Wed Sep 28 02:45:44 2022 ] Eval epoch: 7
[ Wed Sep 28 02:46:17 2022 ] 	Mean test loss of 296 batches: 0.6727774044549143.
[ Wed Sep 28 02:46:17 2022 ] 	Top1: 78.21%
[ Wed Sep 28 02:46:17 2022 ] 	Top5: 97.10%
[ Wed Sep 28 02:46:17 2022 ] Training epoch: 8
[ Wed Sep 28 02:49:15 2022 ] 	Mean training loss: 0.7973. loss2: 0.0000. Mean training acc: 75.23%.
[ Wed Sep 28 02:49:15 2022 ] 	Time consumption: [Data]02%, [Network]98%
[ Wed Sep 28 02:49:15 2022 ] Eval epoch: 8
[ Wed Sep 28 02:49:49 2022 ] 	Mean test loss of 296 batches: 0.6427950523793697.
[ Wed Sep 28 02:49:49 2022 ] 	Top1: 79.38%
[ Wed Sep 28 02:49:49 2022 ] 	Top5: 97.49%
[ Wed Sep 28 02:49:49 2022 ] Training epoch: 9
[ Wed Sep 28 02:52:47 2022 ] 	Mean training loss: 0.7529. loss2: 0.0000. Mean training acc: 76.57%.
[ Wed Sep 28 02:52:47 2022 ] 	Time consumption: [Data]02%, [Network]98%
[ Wed Sep 28 02:52:47 2022 ] Eval epoch: 9
[ Wed Sep 28 02:53:21 2022 ] 	Mean test loss of 296 batches: 0.6611180395089291.
[ Wed Sep 28 02:53:21 2022 ] 	Top1: 78.61%
[ Wed Sep 28 02:53:21 2022 ] 	Top5: 97.31%
[ Wed Sep 28 02:53:21 2022 ] Training epoch: 10
[ Wed Sep 28 02:56:19 2022 ] 	Mean training loss: 0.7428. loss2: 0.0000. Mean training acc: 77.06%.
[ Wed Sep 28 02:56:19 2022 ] 	Time consumption: [Data]02%, [Network]98%
[ Wed Sep 28 02:56:19 2022 ] Eval epoch: 10
[ Wed Sep 28 02:56:53 2022 ] 	Mean test loss of 296 batches: 0.6466957617651772.
[ Wed Sep 28 02:56:53 2022 ] 	Top1: 79.25%
[ Wed Sep 28 02:56:53 2022 ] 	Top5: 96.88%
[ Wed Sep 28 02:56:53 2022 ] Training epoch: 11
[ Wed Sep 28 02:59:51 2022 ] 	Mean training loss: 0.7202. loss2: 0.0000. Mean training acc: 77.59%.
[ Wed Sep 28 02:59:51 2022 ] 	Time consumption: [Data]02%, [Network]98%
[ Wed Sep 28 02:59:51 2022 ] Eval epoch: 11
[ Wed Sep 28 03:00:24 2022 ] 	Mean test loss of 296 batches: 0.7811942840548786.
[ Wed Sep 28 03:00:25 2022 ] 	Top1: 75.42%
[ Wed Sep 28 03:00:25 2022 ] 	Top5: 96.36%
[ Wed Sep 28 03:00:25 2022 ] Training epoch: 12
[ Wed Sep 28 03:03:23 2022 ] 	Mean training loss: 0.6889. loss2: 0.0000. Mean training acc: 78.24%.
[ Wed Sep 28 03:03:23 2022 ] 	Time consumption: [Data]02%, [Network]98%
[ Wed Sep 28 03:03:23 2022 ] Eval epoch: 12
[ Wed Sep 28 03:03:56 2022 ] 	Mean test loss of 296 batches: 0.8167405301654661.
[ Wed Sep 28 03:03:57 2022 ] 	Top1: 74.79%
[ Wed Sep 28 03:03:57 2022 ] 	Top5: 95.67%
[ Wed Sep 28 03:03:57 2022 ] Training epoch: 13
[ Wed Sep 28 03:06:54 2022 ] 	Mean training loss: 0.6820. loss2: 0.0000. Mean training acc: 78.52%.
[ Wed Sep 28 03:06:54 2022 ] 	Time consumption: [Data]02%, [Network]98%
[ Wed Sep 28 03:06:54 2022 ] Eval epoch: 13
[ Wed Sep 28 03:07:28 2022 ] 	Mean test loss of 296 batches: 0.5914705491851311.
[ Wed Sep 28 03:07:28 2022 ] 	Top1: 81.37%
[ Wed Sep 28 03:07:28 2022 ] 	Top5: 97.61%
[ Wed Sep 28 03:07:28 2022 ] Training epoch: 14
[ Wed Sep 28 03:10:26 2022 ] 	Mean training loss: 0.6713. loss2: 0.0000. Mean training acc: 78.90%.
[ Wed Sep 28 03:10:26 2022 ] 	Time consumption: [Data]02%, [Network]98%
[ Wed Sep 28 03:10:26 2022 ] Eval epoch: 14
[ Wed Sep 28 03:11:00 2022 ] 	Mean test loss of 296 batches: 0.9671270274431318.
[ Wed Sep 28 03:11:00 2022 ] 	Top1: 72.04%
[ Wed Sep 28 03:11:00 2022 ] 	Top5: 94.88%
[ Wed Sep 28 03:11:00 2022 ] Training epoch: 15
[ Wed Sep 28 03:13:58 2022 ] 	Mean training loss: 0.6469. loss2: 0.0000. Mean training acc: 79.64%.
[ Wed Sep 28 03:13:58 2022 ] 	Time consumption: [Data]02%, [Network]98%
[ Wed Sep 28 03:13:58 2022 ] Eval epoch: 15
[ Wed Sep 28 03:14:31 2022 ] 	Mean test loss of 296 batches: 0.6264833074465797.
[ Wed Sep 28 03:14:32 2022 ] 	Top1: 79.95%
[ Wed Sep 28 03:14:32 2022 ] 	Top5: 97.25%
[ Wed Sep 28 03:14:32 2022 ] Training epoch: 16
[ Wed Sep 28 03:17:30 2022 ] 	Mean training loss: 0.6487. loss2: 0.0000. Mean training acc: 79.83%.
[ Wed Sep 28 03:17:30 2022 ] 	Time consumption: [Data]02%, [Network]98%
[ Wed Sep 28 03:17:30 2022 ] Eval epoch: 16
[ Wed Sep 28 03:18:04 2022 ] 	Mean test loss of 296 batches: 0.6647569345360672.
[ Wed Sep 28 03:18:04 2022 ] 	Top1: 78.93%
[ Wed Sep 28 03:18:04 2022 ] 	Top5: 97.06%
[ Wed Sep 28 03:18:04 2022 ] Training epoch: 17
[ Wed Sep 28 03:21:03 2022 ] 	Mean training loss: 0.6326. loss2: 0.0000. Mean training acc: 80.25%.
[ Wed Sep 28 03:21:03 2022 ] 	Time consumption: [Data]02%, [Network]98%
[ Wed Sep 28 03:21:03 2022 ] Eval epoch: 17
[ Wed Sep 28 03:21:37 2022 ] 	Mean test loss of 296 batches: 0.7696651747903308.
[ Wed Sep 28 03:21:37 2022 ] 	Top1: 76.53%
[ Wed Sep 28 03:21:37 2022 ] 	Top5: 96.04%
[ Wed Sep 28 03:21:37 2022 ] Training epoch: 18
[ Wed Sep 28 03:24:35 2022 ] 	Mean training loss: 0.6277. loss2: 0.0000. Mean training acc: 80.23%.
[ Wed Sep 28 03:24:35 2022 ] 	Time consumption: [Data]02%, [Network]98%
[ Wed Sep 28 03:24:35 2022 ] Eval epoch: 18
[ Wed Sep 28 03:25:08 2022 ] 	Mean test loss of 296 batches: 0.7929946923175374.
[ Wed Sep 28 03:25:09 2022 ] 	Top1: 75.92%
[ Wed Sep 28 03:25:09 2022 ] 	Top5: 95.52%
[ Wed Sep 28 03:25:09 2022 ] Training epoch: 19
[ Wed Sep 28 03:28:07 2022 ] 	Mean training loss: 0.6103. loss2: 0.0000. Mean training acc: 80.95%.
[ Wed Sep 28 03:28:07 2022 ] 	Time consumption: [Data]02%, [Network]98%
[ Wed Sep 28 03:28:07 2022 ] Eval epoch: 19
[ Wed Sep 28 03:28:40 2022 ] 	Mean test loss of 296 batches: 0.5815233237840034.
[ Wed Sep 28 03:28:41 2022 ] 	Top1: 81.40%
[ Wed Sep 28 03:28:41 2022 ] 	Top5: 97.48%
[ Wed Sep 28 03:28:41 2022 ] Training epoch: 20
[ Wed Sep 28 03:31:39 2022 ] 	Mean training loss: 0.6060. loss2: 0.0000. Mean training acc: 80.87%.
[ Wed Sep 28 03:31:39 2022 ] 	Time consumption: [Data]02%, [Network]98%
[ Wed Sep 28 03:31:39 2022 ] Eval epoch: 20
[ Wed Sep 28 03:32:12 2022 ] 	Mean test loss of 296 batches: 0.7892189176904189.
[ Wed Sep 28 03:32:12 2022 ] 	Top1: 75.85%
[ Wed Sep 28 03:32:12 2022 ] 	Top5: 96.87%
[ Wed Sep 28 03:32:12 2022 ] Training epoch: 21
[ Wed Sep 28 03:35:10 2022 ] 	Mean training loss: 0.5942. loss2: 0.0000. Mean training acc: 81.30%.
[ Wed Sep 28 03:35:10 2022 ] 	Time consumption: [Data]02%, [Network]98%
[ Wed Sep 28 03:35:10 2022 ] Eval epoch: 21
[ Wed Sep 28 03:35:44 2022 ] 	Mean test loss of 296 batches: 0.4520778152084834.
[ Wed Sep 28 03:35:44 2022 ] 	Top1: 85.53%
[ Wed Sep 28 03:35:44 2022 ] 	Top5: 98.48%
[ Wed Sep 28 03:35:44 2022 ] Training epoch: 22
[ Wed Sep 28 03:38:42 2022 ] 	Mean training loss: 0.5969. loss2: 0.0000. Mean training acc: 81.11%.
[ Wed Sep 28 03:38:42 2022 ] 	Time consumption: [Data]02%, [Network]98%
[ Wed Sep 28 03:38:42 2022 ] Eval epoch: 22
[ Wed Sep 28 03:39:16 2022 ] 	Mean test loss of 296 batches: 0.6470672926185904.
[ Wed Sep 28 03:39:16 2022 ] 	Top1: 79.73%
[ Wed Sep 28 03:39:16 2022 ] 	Top5: 96.94%
[ Wed Sep 28 03:39:16 2022 ] Training epoch: 23
[ Wed Sep 28 03:42:14 2022 ] 	Mean training loss: 0.5858. loss2: 0.0000. Mean training acc: 81.59%.
[ Wed Sep 28 03:42:14 2022 ] 	Time consumption: [Data]02%, [Network]98%
[ Wed Sep 28 03:42:14 2022 ] Eval epoch: 23
[ Wed Sep 28 03:42:48 2022 ] 	Mean test loss of 296 batches: 0.674074789962253.
[ Wed Sep 28 03:42:48 2022 ] 	Top1: 77.96%
[ Wed Sep 28 03:42:48 2022 ] 	Top5: 97.38%
[ Wed Sep 28 03:42:48 2022 ] Training epoch: 24
[ Wed Sep 28 03:45:46 2022 ] 	Mean training loss: 0.5931. loss2: 0.0000. Mean training acc: 81.36%.
[ Wed Sep 28 03:45:46 2022 ] 	Time consumption: [Data]02%, [Network]98%
[ Wed Sep 28 03:45:46 2022 ] Eval epoch: 24
[ Wed Sep 28 03:46:19 2022 ] 	Mean test loss of 296 batches: 0.661426027783671.
[ Wed Sep 28 03:46:19 2022 ] 	Top1: 79.86%
[ Wed Sep 28 03:46:19 2022 ] 	Top5: 96.89%
[ Wed Sep 28 03:46:19 2022 ] Training epoch: 25
[ Wed Sep 28 03:49:17 2022 ] 	Mean training loss: 0.5873. loss2: 0.0000. Mean training acc: 81.43%.
[ Wed Sep 28 03:49:17 2022 ] 	Time consumption: [Data]02%, [Network]98%
[ Wed Sep 28 03:49:17 2022 ] Eval epoch: 25
[ Wed Sep 28 03:49:51 2022 ] 	Mean test loss of 296 batches: 0.5077977419100903.
[ Wed Sep 28 03:49:51 2022 ] 	Top1: 83.98%
[ Wed Sep 28 03:49:51 2022 ] 	Top5: 97.74%
[ Wed Sep 28 03:49:51 2022 ] Training epoch: 26
[ Wed Sep 28 03:52:49 2022 ] 	Mean training loss: 0.5731. loss2: 0.0000. Mean training acc: 81.96%.
[ Wed Sep 28 03:52:49 2022 ] 	Time consumption: [Data]02%, [Network]98%
[ Wed Sep 28 03:52:49 2022 ] Eval epoch: 26
[ Wed Sep 28 03:53:23 2022 ] 	Mean test loss of 296 batches: 0.5044775246768385.
[ Wed Sep 28 03:53:23 2022 ] 	Top1: 84.47%
[ Wed Sep 28 03:53:23 2022 ] 	Top5: 97.89%
[ Wed Sep 28 03:53:23 2022 ] Training epoch: 27
[ Wed Sep 28 03:56:21 2022 ] 	Mean training loss: 0.5773. loss2: 0.0000. Mean training acc: 81.49%.
[ Wed Sep 28 03:56:21 2022 ] 	Time consumption: [Data]02%, [Network]98%
[ Wed Sep 28 03:56:21 2022 ] Eval epoch: 27
[ Wed Sep 28 03:56:55 2022 ] 	Mean test loss of 296 batches: 0.6154179265974341.
[ Wed Sep 28 03:56:55 2022 ] 	Top1: 80.49%
[ Wed Sep 28 03:56:55 2022 ] 	Top5: 97.62%
[ Wed Sep 28 03:56:55 2022 ] Training epoch: 28
[ Wed Sep 28 03:59:53 2022 ] 	Mean training loss: 0.5679. loss2: 0.0000. Mean training acc: 82.14%.
[ Wed Sep 28 03:59:53 2022 ] 	Time consumption: [Data]02%, [Network]98%
[ Wed Sep 28 03:59:53 2022 ] Eval epoch: 28
[ Wed Sep 28 04:00:27 2022 ] 	Mean test loss of 296 batches: 0.5984178087296518.
[ Wed Sep 28 04:00:27 2022 ] 	Top1: 80.38%
[ Wed Sep 28 04:00:27 2022 ] 	Top5: 97.75%
[ Wed Sep 28 04:00:27 2022 ] Training epoch: 29
[ Wed Sep 28 04:03:25 2022 ] 	Mean training loss: 0.5683. loss2: 0.0000. Mean training acc: 82.20%.
[ Wed Sep 28 04:03:25 2022 ] 	Time consumption: [Data]02%, [Network]98%
[ Wed Sep 28 04:03:25 2022 ] Eval epoch: 29
[ Wed Sep 28 04:03:58 2022 ] 	Mean test loss of 296 batches: 0.7321506981310006.
[ Wed Sep 28 04:03:58 2022 ] 	Top1: 77.67%
[ Wed Sep 28 04:03:58 2022 ] 	Top5: 95.72%
[ Wed Sep 28 04:03:58 2022 ] Training epoch: 30
[ Wed Sep 28 04:06:57 2022 ] 	Mean training loss: 0.5606. loss2: 0.0000. Mean training acc: 82.25%.
[ Wed Sep 28 04:06:57 2022 ] 	Time consumption: [Data]02%, [Network]98%
[ Wed Sep 28 04:06:57 2022 ] Eval epoch: 30
[ Wed Sep 28 04:07:30 2022 ] 	Mean test loss of 296 batches: 0.5044230662185598.
[ Wed Sep 28 04:07:30 2022 ] 	Top1: 84.26%
[ Wed Sep 28 04:07:30 2022 ] 	Top5: 98.13%
[ Wed Sep 28 04:07:30 2022 ] Training epoch: 31
[ Wed Sep 28 04:10:28 2022 ] 	Mean training loss: 0.5672. loss2: 0.0000. Mean training acc: 82.20%.
[ Wed Sep 28 04:10:28 2022 ] 	Time consumption: [Data]02%, [Network]98%
[ Wed Sep 28 04:10:28 2022 ] Eval epoch: 31
[ Wed Sep 28 04:11:02 2022 ] 	Mean test loss of 296 batches: 0.5105747015693703.
[ Wed Sep 28 04:11:02 2022 ] 	Top1: 83.70%
[ Wed Sep 28 04:11:02 2022 ] 	Top5: 98.14%
[ Wed Sep 28 04:11:02 2022 ] Training epoch: 32
[ Wed Sep 28 04:14:00 2022 ] 	Mean training loss: 0.5509. loss2: 0.0000. Mean training acc: 82.61%.
[ Wed Sep 28 04:14:00 2022 ] 	Time consumption: [Data]02%, [Network]98%
[ Wed Sep 28 04:14:00 2022 ] Eval epoch: 32
[ Wed Sep 28 04:14:34 2022 ] 	Mean test loss of 296 batches: 0.677149392563749.
[ Wed Sep 28 04:14:34 2022 ] 	Top1: 79.65%
[ Wed Sep 28 04:14:34 2022 ] 	Top5: 96.71%
[ Wed Sep 28 04:14:34 2022 ] Training epoch: 33
[ Wed Sep 28 04:17:32 2022 ] 	Mean training loss: 0.5559. loss2: 0.0000. Mean training acc: 82.50%.
[ Wed Sep 28 04:17:32 2022 ] 	Time consumption: [Data]02%, [Network]98%
[ Wed Sep 28 04:17:32 2022 ] Eval epoch: 33
[ Wed Sep 28 04:18:05 2022 ] 	Mean test loss of 296 batches: 0.5262885422420662.
[ Wed Sep 28 04:18:05 2022 ] 	Top1: 82.41%
[ Wed Sep 28 04:18:06 2022 ] 	Top5: 98.27%
[ Wed Sep 28 04:18:06 2022 ] Training epoch: 34
[ Wed Sep 28 04:21:04 2022 ] 	Mean training loss: 0.5505. loss2: 0.0000. Mean training acc: 82.67%.
[ Wed Sep 28 04:21:04 2022 ] 	Time consumption: [Data]02%, [Network]98%
[ Wed Sep 28 04:21:04 2022 ] Eval epoch: 34
[ Wed Sep 28 04:21:38 2022 ] 	Mean test loss of 296 batches: 0.5281892784342572.
[ Wed Sep 28 04:21:38 2022 ] 	Top1: 83.66%
[ Wed Sep 28 04:21:38 2022 ] 	Top5: 97.62%
[ Wed Sep 28 04:21:38 2022 ] Training epoch: 35
[ Wed Sep 28 04:24:36 2022 ] 	Mean training loss: 0.5506. loss2: 0.0000. Mean training acc: 82.54%.
[ Wed Sep 28 04:24:36 2022 ] 	Time consumption: [Data]02%, [Network]98%
[ Wed Sep 28 04:24:36 2022 ] Eval epoch: 35
[ Wed Sep 28 04:25:09 2022 ] 	Mean test loss of 296 batches: 0.6037412559663927.
[ Wed Sep 28 04:25:09 2022 ] 	Top1: 80.78%
[ Wed Sep 28 04:25:10 2022 ] 	Top5: 97.86%
[ Wed Sep 28 04:25:10 2022 ] Training epoch: 36
[ Wed Sep 28 04:28:08 2022 ] 	Mean training loss: 0.5480. loss2: 0.0000. Mean training acc: 82.83%.
[ Wed Sep 28 04:28:08 2022 ] 	Time consumption: [Data]02%, [Network]98%
[ Wed Sep 28 04:28:08 2022 ] Eval epoch: 36
[ Wed Sep 28 04:28:41 2022 ] 	Mean test loss of 296 batches: 0.46574957930558436.
[ Wed Sep 28 04:28:41 2022 ] 	Top1: 85.12%
[ Wed Sep 28 04:28:41 2022 ] 	Top5: 98.26%
[ Wed Sep 28 04:28:41 2022 ] Training epoch: 37
[ Wed Sep 28 04:31:39 2022 ] 	Mean training loss: 0.5438. loss2: 0.0000. Mean training acc: 82.72%.
[ Wed Sep 28 04:31:39 2022 ] 	Time consumption: [Data]02%, [Network]98%
[ Wed Sep 28 04:31:39 2022 ] Eval epoch: 37
[ Wed Sep 28 04:32:13 2022 ] 	Mean test loss of 296 batches: 0.45486807933933027.
[ Wed Sep 28 04:32:13 2022 ] 	Top1: 85.39%
[ Wed Sep 28 04:32:13 2022 ] 	Top5: 98.26%
[ Wed Sep 28 04:32:13 2022 ] Training epoch: 38
[ Wed Sep 28 04:35:11 2022 ] 	Mean training loss: 0.5421. loss2: 0.0000. Mean training acc: 82.91%.
[ Wed Sep 28 04:35:11 2022 ] 	Time consumption: [Data]02%, [Network]98%
[ Wed Sep 28 04:35:11 2022 ] Eval epoch: 38
[ Wed Sep 28 04:35:45 2022 ] 	Mean test loss of 296 batches: 0.5962590531420868.
[ Wed Sep 28 04:35:45 2022 ] 	Top1: 81.61%
[ Wed Sep 28 04:35:45 2022 ] 	Top5: 97.25%
[ Wed Sep 28 04:35:45 2022 ] Training epoch: 39
[ Wed Sep 28 04:38:43 2022 ] 	Mean training loss: 0.5345. loss2: 0.0000. Mean training acc: 83.11%.
[ Wed Sep 28 04:38:43 2022 ] 	Time consumption: [Data]02%, [Network]98%
[ Wed Sep 28 04:38:43 2022 ] Eval epoch: 39
[ Wed Sep 28 04:39:16 2022 ] 	Mean test loss of 296 batches: 0.46480310315618645.
[ Wed Sep 28 04:39:16 2022 ] 	Top1: 85.42%
[ Wed Sep 28 04:39:16 2022 ] 	Top5: 97.79%
[ Wed Sep 28 04:39:16 2022 ] Training epoch: 40
[ Wed Sep 28 04:42:14 2022 ] 	Mean training loss: 0.5420. loss2: 0.0000. Mean training acc: 83.21%.
[ Wed Sep 28 04:42:14 2022 ] 	Time consumption: [Data]02%, [Network]98%
[ Wed Sep 28 04:42:14 2022 ] Eval epoch: 40
[ Wed Sep 28 04:42:48 2022 ] 	Mean test loss of 296 batches: 0.49711170762374596.
[ Wed Sep 28 04:42:48 2022 ] 	Top1: 84.09%
[ Wed Sep 28 04:42:48 2022 ] 	Top5: 98.42%
[ Wed Sep 28 04:42:48 2022 ] Training epoch: 41
[ Wed Sep 28 04:45:47 2022 ] 	Mean training loss: 0.5343. loss2: 0.0000. Mean training acc: 83.04%.
[ Wed Sep 28 04:45:47 2022 ] 	Time consumption: [Data]02%, [Network]98%
[ Wed Sep 28 04:45:47 2022 ] Eval epoch: 41
[ Wed Sep 28 04:46:20 2022 ] 	Mean test loss of 296 batches: 0.5609561980173394.
[ Wed Sep 28 04:46:21 2022 ] 	Top1: 82.15%
[ Wed Sep 28 04:46:21 2022 ] 	Top5: 97.56%
[ Wed Sep 28 04:46:21 2022 ] Training epoch: 42
[ Wed Sep 28 04:49:19 2022 ] 	Mean training loss: 0.5376. loss2: 0.0000. Mean training acc: 83.00%.
[ Wed Sep 28 04:49:19 2022 ] 	Time consumption: [Data]02%, [Network]98%
[ Wed Sep 28 04:49:19 2022 ] Eval epoch: 42
[ Wed Sep 28 04:49:52 2022 ] 	Mean test loss of 296 batches: 0.6349597020527801.
[ Wed Sep 28 04:49:53 2022 ] 	Top1: 80.09%
[ Wed Sep 28 04:49:53 2022 ] 	Top5: 97.26%
[ Wed Sep 28 04:49:53 2022 ] Training epoch: 43
[ Wed Sep 28 04:52:51 2022 ] 	Mean training loss: 0.5310. loss2: 0.0000. Mean training acc: 83.43%.
[ Wed Sep 28 04:52:51 2022 ] 	Time consumption: [Data]02%, [Network]98%
[ Wed Sep 28 04:52:51 2022 ] Eval epoch: 43
[ Wed Sep 28 04:53:25 2022 ] 	Mean test loss of 296 batches: 0.519523137374907.
[ Wed Sep 28 04:53:25 2022 ] 	Top1: 83.53%
[ Wed Sep 28 04:53:25 2022 ] 	Top5: 97.92%
[ Wed Sep 28 04:53:25 2022 ] Training epoch: 44
[ Wed Sep 28 04:56:23 2022 ] 	Mean training loss: 0.5328. loss2: 0.0000. Mean training acc: 83.19%.
[ Wed Sep 28 04:56:23 2022 ] 	Time consumption: [Data]02%, [Network]98%
[ Wed Sep 28 04:56:23 2022 ] Eval epoch: 44
[ Wed Sep 28 04:56:56 2022 ] 	Mean test loss of 296 batches: 0.5188664896161975.
[ Wed Sep 28 04:56:56 2022 ] 	Top1: 83.75%
[ Wed Sep 28 04:56:56 2022 ] 	Top5: 98.35%
[ Wed Sep 28 04:56:56 2022 ] Training epoch: 45
[ Wed Sep 28 04:59:54 2022 ] 	Mean training loss: 0.5338. loss2: 0.0000. Mean training acc: 83.10%.
[ Wed Sep 28 04:59:54 2022 ] 	Time consumption: [Data]02%, [Network]98%
[ Wed Sep 28 04:59:54 2022 ] Eval epoch: 45
[ Wed Sep 28 05:00:28 2022 ] 	Mean test loss of 296 batches: 0.5872613656762484.
[ Wed Sep 28 05:00:28 2022 ] 	Top1: 81.61%
[ Wed Sep 28 05:00:28 2022 ] 	Top5: 97.37%
[ Wed Sep 28 05:00:28 2022 ] Training epoch: 46
[ Wed Sep 28 05:03:26 2022 ] 	Mean training loss: 0.5236. loss2: 0.0000. Mean training acc: 83.35%.
[ Wed Sep 28 05:03:26 2022 ] 	Time consumption: [Data]02%, [Network]98%
[ Wed Sep 28 05:03:26 2022 ] Eval epoch: 46
[ Wed Sep 28 05:04:00 2022 ] 	Mean test loss of 296 batches: 0.5204971604351256.
[ Wed Sep 28 05:04:00 2022 ] 	Top1: 84.01%
[ Wed Sep 28 05:04:01 2022 ] 	Top5: 97.74%
[ Wed Sep 28 05:04:01 2022 ] Training epoch: 47
[ Wed Sep 28 05:06:58 2022 ] 	Mean training loss: 0.5241. loss2: 0.0000. Mean training acc: 83.52%.
[ Wed Sep 28 05:06:58 2022 ] 	Time consumption: [Data]02%, [Network]98%
[ Wed Sep 28 05:06:59 2022 ] Eval epoch: 47
[ Wed Sep 28 05:07:32 2022 ] 	Mean test loss of 296 batches: 0.5695460730710545.
[ Wed Sep 28 05:07:32 2022 ] 	Top1: 82.70%
[ Wed Sep 28 05:07:32 2022 ] 	Top5: 97.68%
[ Wed Sep 28 05:07:32 2022 ] Training epoch: 48
[ Wed Sep 28 05:10:30 2022 ] 	Mean training loss: 0.5347. loss2: 0.0000. Mean training acc: 83.18%.
[ Wed Sep 28 05:10:30 2022 ] 	Time consumption: [Data]02%, [Network]98%
[ Wed Sep 28 05:10:30 2022 ] Eval epoch: 48
[ Wed Sep 28 05:11:04 2022 ] 	Mean test loss of 296 batches: 0.5286743784877094.
[ Wed Sep 28 05:11:04 2022 ] 	Top1: 83.50%
[ Wed Sep 28 05:11:04 2022 ] 	Top5: 97.76%
[ Wed Sep 28 05:11:04 2022 ] Training epoch: 49
[ Wed Sep 28 05:14:02 2022 ] 	Mean training loss: 0.5279. loss2: 0.0000. Mean training acc: 83.25%.
[ Wed Sep 28 05:14:02 2022 ] 	Time consumption: [Data]02%, [Network]98%
[ Wed Sep 28 05:14:02 2022 ] Eval epoch: 49
[ Wed Sep 28 05:14:36 2022 ] 	Mean test loss of 296 batches: 0.503351895642039.
[ Wed Sep 28 05:14:36 2022 ] 	Top1: 84.22%
[ Wed Sep 28 05:14:36 2022 ] 	Top5: 98.34%
[ Wed Sep 28 05:14:36 2022 ] Training epoch: 50
[ Wed Sep 28 05:17:34 2022 ] 	Mean training loss: 0.5281. loss2: 0.0000. Mean training acc: 83.25%.
[ Wed Sep 28 05:17:34 2022 ] 	Time consumption: [Data]02%, [Network]98%
[ Wed Sep 28 05:17:34 2022 ] Eval epoch: 50
[ Wed Sep 28 05:18:07 2022 ] 	Mean test loss of 296 batches: 0.6874342287915784.
[ Wed Sep 28 05:18:08 2022 ] 	Top1: 78.81%
[ Wed Sep 28 05:18:08 2022 ] 	Top5: 97.11%
[ Wed Sep 28 05:18:08 2022 ] Training epoch: 51
[ Wed Sep 28 05:21:06 2022 ] 	Mean training loss: 0.5292. loss2: 0.0000. Mean training acc: 83.25%.
[ Wed Sep 28 05:21:06 2022 ] 	Time consumption: [Data]02%, [Network]98%
[ Wed Sep 28 05:21:06 2022 ] Eval epoch: 51
[ Wed Sep 28 05:21:40 2022 ] 	Mean test loss of 296 batches: 0.8215979694111927.
[ Wed Sep 28 05:21:40 2022 ] 	Top1: 76.00%
[ Wed Sep 28 05:21:40 2022 ] 	Top5: 95.58%
[ Wed Sep 28 05:21:40 2022 ] Training epoch: 52
[ Wed Sep 28 05:24:39 2022 ] 	Mean training loss: 0.5248. loss2: 0.0000. Mean training acc: 83.50%.
[ Wed Sep 28 05:24:39 2022 ] 	Time consumption: [Data]02%, [Network]98%
[ Wed Sep 28 05:24:39 2022 ] Eval epoch: 52
[ Wed Sep 28 05:25:13 2022 ] 	Mean test loss of 296 batches: 0.7911335954795012.
[ Wed Sep 28 05:25:13 2022 ] 	Top1: 76.80%
[ Wed Sep 28 05:25:13 2022 ] 	Top5: 96.16%
[ Wed Sep 28 05:25:13 2022 ] Training epoch: 53
[ Wed Sep 28 05:28:11 2022 ] 	Mean training loss: 0.5243. loss2: 0.0000. Mean training acc: 83.47%.
[ Wed Sep 28 05:28:11 2022 ] 	Time consumption: [Data]02%, [Network]98%
[ Wed Sep 28 05:28:11 2022 ] Eval epoch: 53
[ Wed Sep 28 05:28:44 2022 ] 	Mean test loss of 296 batches: 0.44695120314891273.
[ Wed Sep 28 05:28:44 2022 ] 	Top1: 85.70%
[ Wed Sep 28 05:28:45 2022 ] 	Top5: 98.24%
[ Wed Sep 28 05:28:45 2022 ] Training epoch: 54
[ Wed Sep 28 05:31:42 2022 ] 	Mean training loss: 0.5240. loss2: 0.0000. Mean training acc: 83.42%.
[ Wed Sep 28 05:31:42 2022 ] 	Time consumption: [Data]02%, [Network]98%
[ Wed Sep 28 05:31:43 2022 ] Eval epoch: 54
[ Wed Sep 28 05:32:16 2022 ] 	Mean test loss of 296 batches: 0.5459148634627864.
[ Wed Sep 28 05:32:16 2022 ] 	Top1: 82.97%
[ Wed Sep 28 05:32:16 2022 ] 	Top5: 98.02%
[ Wed Sep 28 05:32:16 2022 ] Training epoch: 55
[ Wed Sep 28 05:35:15 2022 ] 	Mean training loss: 0.5236. loss2: 0.0000. Mean training acc: 83.50%.
[ Wed Sep 28 05:35:15 2022 ] 	Time consumption: [Data]02%, [Network]98%
[ Wed Sep 28 05:35:15 2022 ] Eval epoch: 55
[ Wed Sep 28 05:35:49 2022 ] 	Mean test loss of 296 batches: 0.5545642412192112.
[ Wed Sep 28 05:35:49 2022 ] 	Top1: 82.91%
[ Wed Sep 28 05:35:49 2022 ] 	Top5: 97.31%
[ Wed Sep 28 05:35:49 2022 ] Training epoch: 56
[ Wed Sep 28 05:38:47 2022 ] 	Mean training loss: 0.5169. loss2: 0.0000. Mean training acc: 83.59%.
[ Wed Sep 28 05:38:47 2022 ] 	Time consumption: [Data]02%, [Network]98%
[ Wed Sep 28 05:38:47 2022 ] Eval epoch: 56
[ Wed Sep 28 05:39:21 2022 ] 	Mean test loss of 296 batches: 0.557610997527435.
[ Wed Sep 28 05:39:21 2022 ] 	Top1: 83.25%
[ Wed Sep 28 05:39:21 2022 ] 	Top5: 97.42%
[ Wed Sep 28 05:39:21 2022 ] Training epoch: 57
[ Wed Sep 28 05:42:19 2022 ] 	Mean training loss: 0.5226. loss2: 0.0000. Mean training acc: 83.71%.
[ Wed Sep 28 05:42:19 2022 ] 	Time consumption: [Data]02%, [Network]98%
[ Wed Sep 28 05:42:19 2022 ] Eval epoch: 57
[ Wed Sep 28 05:42:53 2022 ] 	Mean test loss of 296 batches: 0.6222666678195065.
[ Wed Sep 28 05:42:53 2022 ] 	Top1: 80.95%
[ Wed Sep 28 05:42:53 2022 ] 	Top5: 97.35%
[ Wed Sep 28 05:42:53 2022 ] Training epoch: 58
[ Wed Sep 28 05:45:50 2022 ] 	Mean training loss: 0.5220. loss2: 0.0000. Mean training acc: 83.20%.
[ Wed Sep 28 05:45:50 2022 ] 	Time consumption: [Data]02%, [Network]98%
[ Wed Sep 28 05:45:50 2022 ] Eval epoch: 58
[ Wed Sep 28 05:46:24 2022 ] 	Mean test loss of 296 batches: 1.562163891824516.
[ Wed Sep 28 05:46:24 2022 ] 	Top1: 64.53%
[ Wed Sep 28 05:46:24 2022 ] 	Top5: 91.43%
[ Wed Sep 28 05:46:24 2022 ] Training epoch: 59
[ Wed Sep 28 05:49:22 2022 ] 	Mean training loss: 0.5256. loss2: 0.0000. Mean training acc: 83.32%.
[ Wed Sep 28 05:49:22 2022 ] 	Time consumption: [Data]02%, [Network]98%
[ Wed Sep 28 05:49:22 2022 ] Eval epoch: 59
[ Wed Sep 28 05:49:56 2022 ] 	Mean test loss of 296 batches: 0.44431971756086963.
[ Wed Sep 28 05:49:56 2022 ] 	Top1: 85.61%
[ Wed Sep 28 05:49:56 2022 ] 	Top5: 98.36%
[ Wed Sep 28 05:49:56 2022 ] Training epoch: 60
[ Wed Sep 28 05:52:54 2022 ] 	Mean training loss: 0.5238. loss2: 0.0000. Mean training acc: 83.39%.
[ Wed Sep 28 05:52:54 2022 ] 	Time consumption: [Data]02%, [Network]98%
[ Wed Sep 28 05:52:54 2022 ] Eval epoch: 60
[ Wed Sep 28 05:53:28 2022 ] 	Mean test loss of 296 batches: 0.526747160817723.
[ Wed Sep 28 05:53:28 2022 ] 	Top1: 83.18%
[ Wed Sep 28 05:53:28 2022 ] 	Top5: 98.10%
[ Wed Sep 28 05:53:28 2022 ] Training epoch: 61
[ Wed Sep 28 05:56:26 2022 ] 	Mean training loss: 0.5278. loss2: 0.0000. Mean training acc: 83.32%.
[ Wed Sep 28 05:56:26 2022 ] 	Time consumption: [Data]02%, [Network]98%
[ Wed Sep 28 05:56:26 2022 ] Eval epoch: 61
[ Wed Sep 28 05:57:00 2022 ] 	Mean test loss of 296 batches: 0.4770082738753912.
[ Wed Sep 28 05:57:00 2022 ] 	Top1: 84.80%
[ Wed Sep 28 05:57:00 2022 ] 	Top5: 98.54%
[ Wed Sep 28 05:57:00 2022 ] Training epoch: 62
[ Wed Sep 28 05:59:58 2022 ] 	Mean training loss: 0.5203. loss2: 0.0000. Mean training acc: 83.47%.
[ Wed Sep 28 05:59:58 2022 ] 	Time consumption: [Data]02%, [Network]98%
[ Wed Sep 28 05:59:58 2022 ] Eval epoch: 62
[ Wed Sep 28 06:00:32 2022 ] 	Mean test loss of 296 batches: 0.6500772977801593.
[ Wed Sep 28 06:00:32 2022 ] 	Top1: 80.22%
[ Wed Sep 28 06:00:32 2022 ] 	Top5: 96.95%
[ Wed Sep 28 06:00:32 2022 ] Training epoch: 63
[ Wed Sep 28 06:03:30 2022 ] 	Mean training loss: 0.5187. loss2: 0.0000. Mean training acc: 83.55%.
[ Wed Sep 28 06:03:30 2022 ] 	Time consumption: [Data]02%, [Network]98%
[ Wed Sep 28 06:03:30 2022 ] Eval epoch: 63
[ Wed Sep 28 06:04:04 2022 ] 	Mean test loss of 296 batches: 0.42069905205956987.
[ Wed Sep 28 06:04:04 2022 ] 	Top1: 86.26%
[ Wed Sep 28 06:04:04 2022 ] 	Top5: 98.46%
[ Wed Sep 28 06:04:04 2022 ] Training epoch: 64
[ Wed Sep 28 06:07:02 2022 ] 	Mean training loss: 0.5159. loss2: 0.0000. Mean training acc: 83.57%.
[ Wed Sep 28 06:07:02 2022 ] 	Time consumption: [Data]02%, [Network]98%
[ Wed Sep 28 06:07:02 2022 ] Eval epoch: 64
[ Wed Sep 28 06:07:35 2022 ] 	Mean test loss of 296 batches: 0.5100538340472692.
[ Wed Sep 28 06:07:35 2022 ] 	Top1: 83.74%
[ Wed Sep 28 06:07:36 2022 ] 	Top5: 98.04%
[ Wed Sep 28 06:07:36 2022 ] Training epoch: 65
[ Wed Sep 28 06:10:33 2022 ] 	Mean training loss: 0.5148. loss2: 0.0000. Mean training acc: 83.82%.
[ Wed Sep 28 06:10:33 2022 ] 	Time consumption: [Data]02%, [Network]98%
[ Wed Sep 28 06:10:34 2022 ] Eval epoch: 65
[ Wed Sep 28 06:11:07 2022 ] 	Mean test loss of 296 batches: 0.5800760434688749.
[ Wed Sep 28 06:11:07 2022 ] 	Top1: 81.77%
[ Wed Sep 28 06:11:07 2022 ] 	Top5: 97.67%
[ Wed Sep 28 06:11:07 2022 ] Training epoch: 66
[ Wed Sep 28 06:14:06 2022 ] 	Mean training loss: 0.5165. loss2: 0.0000. Mean training acc: 83.69%.
[ Wed Sep 28 06:14:06 2022 ] 	Time consumption: [Data]02%, [Network]98%
[ Wed Sep 28 06:14:06 2022 ] Eval epoch: 66
[ Wed Sep 28 06:14:39 2022 ] 	Mean test loss of 296 batches: 0.6167843479763817.
[ Wed Sep 28 06:14:39 2022 ] 	Top1: 80.96%
[ Wed Sep 28 06:14:39 2022 ] 	Top5: 98.12%
[ Wed Sep 28 06:14:39 2022 ] Training epoch: 67
[ Wed Sep 28 06:17:38 2022 ] 	Mean training loss: 0.5155. loss2: 0.0000. Mean training acc: 83.82%.
[ Wed Sep 28 06:17:38 2022 ] 	Time consumption: [Data]02%, [Network]98%
[ Wed Sep 28 06:17:38 2022 ] Eval epoch: 67
[ Wed Sep 28 06:18:11 2022 ] 	Mean test loss of 296 batches: 0.5056485425982926.
[ Wed Sep 28 06:18:11 2022 ] 	Top1: 83.95%
[ Wed Sep 28 06:18:11 2022 ] 	Top5: 98.14%
[ Wed Sep 28 06:18:11 2022 ] Training epoch: 68
[ Wed Sep 28 06:21:09 2022 ] 	Mean training loss: 0.5158. loss2: 0.0000. Mean training acc: 83.72%.
[ Wed Sep 28 06:21:09 2022 ] 	Time consumption: [Data]02%, [Network]98%
[ Wed Sep 28 06:21:09 2022 ] Eval epoch: 68
[ Wed Sep 28 06:21:43 2022 ] 	Mean test loss of 296 batches: 0.45217381212608637.
[ Wed Sep 28 06:21:43 2022 ] 	Top1: 85.83%
[ Wed Sep 28 06:21:43 2022 ] 	Top5: 98.32%
[ Wed Sep 28 06:21:43 2022 ] Training epoch: 69
[ Wed Sep 28 06:24:41 2022 ] 	Mean training loss: 0.5144. loss2: 0.0000. Mean training acc: 83.77%.
[ Wed Sep 28 06:24:41 2022 ] 	Time consumption: [Data]02%, [Network]98%
[ Wed Sep 28 06:24:41 2022 ] Eval epoch: 69
[ Wed Sep 28 06:25:15 2022 ] 	Mean test loss of 296 batches: 0.44454972649848945.
[ Wed Sep 28 06:25:15 2022 ] 	Top1: 85.94%
[ Wed Sep 28 06:25:15 2022 ] 	Top5: 98.58%
[ Wed Sep 28 06:25:15 2022 ] Training epoch: 70
[ Wed Sep 28 06:28:13 2022 ] 	Mean training loss: 0.5263. loss2: 0.0000. Mean training acc: 83.38%.
[ Wed Sep 28 06:28:13 2022 ] 	Time consumption: [Data]02%, [Network]98%
[ Wed Sep 28 06:28:13 2022 ] Eval epoch: 70
[ Wed Sep 28 06:28:46 2022 ] 	Mean test loss of 296 batches: 0.5684509831103118.
[ Wed Sep 28 06:28:46 2022 ] 	Top1: 82.30%
[ Wed Sep 28 06:28:47 2022 ] 	Top5: 97.67%
[ Wed Sep 28 06:28:47 2022 ] Training epoch: 71
[ Wed Sep 28 06:31:45 2022 ] 	Mean training loss: 0.5104. loss2: 0.0000. Mean training acc: 83.77%.
[ Wed Sep 28 06:31:45 2022 ] 	Time consumption: [Data]02%, [Network]98%
[ Wed Sep 28 06:31:45 2022 ] Eval epoch: 71
[ Wed Sep 28 06:32:18 2022 ] 	Mean test loss of 296 batches: 0.6073233747502437.
[ Wed Sep 28 06:32:18 2022 ] 	Top1: 81.20%
[ Wed Sep 28 06:32:18 2022 ] 	Top5: 97.07%
[ Wed Sep 28 06:32:18 2022 ] Training epoch: 72
[ Wed Sep 28 06:35:16 2022 ] 	Mean training loss: 0.5179. loss2: 0.0000. Mean training acc: 83.51%.
[ Wed Sep 28 06:35:16 2022 ] 	Time consumption: [Data]02%, [Network]98%
[ Wed Sep 28 06:35:16 2022 ] Eval epoch: 72
[ Wed Sep 28 06:35:50 2022 ] 	Mean test loss of 296 batches: 0.4771091787496934.
[ Wed Sep 28 06:35:50 2022 ] 	Top1: 85.07%
[ Wed Sep 28 06:35:50 2022 ] 	Top5: 98.20%
[ Wed Sep 28 06:35:50 2022 ] Training epoch: 73
[ Wed Sep 28 06:38:48 2022 ] 	Mean training loss: 0.5131. loss2: 0.0000. Mean training acc: 83.71%.
[ Wed Sep 28 06:38:48 2022 ] 	Time consumption: [Data]02%, [Network]98%
[ Wed Sep 28 06:38:49 2022 ] Eval epoch: 73
[ Wed Sep 28 06:39:22 2022 ] 	Mean test loss of 296 batches: 0.6469464971608406.
[ Wed Sep 28 06:39:22 2022 ] 	Top1: 79.89%
[ Wed Sep 28 06:39:22 2022 ] 	Top5: 96.69%
[ Wed Sep 28 06:39:22 2022 ] Training epoch: 74
[ Wed Sep 28 06:42:20 2022 ] 	Mean training loss: 0.5104. loss2: 0.0000. Mean training acc: 84.00%.
[ Wed Sep 28 06:42:20 2022 ] 	Time consumption: [Data]02%, [Network]98%
[ Wed Sep 28 06:42:20 2022 ] Eval epoch: 74
[ Wed Sep 28 06:42:54 2022 ] 	Mean test loss of 296 batches: 0.505168699483211.
[ Wed Sep 28 06:42:54 2022 ] 	Top1: 83.75%
[ Wed Sep 28 06:42:54 2022 ] 	Top5: 98.21%
[ Wed Sep 28 06:42:54 2022 ] Training epoch: 75
[ Wed Sep 28 06:45:52 2022 ] 	Mean training loss: 0.5111. loss2: 0.0000. Mean training acc: 83.91%.
[ Wed Sep 28 06:45:52 2022 ] 	Time consumption: [Data]02%, [Network]98%
[ Wed Sep 28 06:45:52 2022 ] Eval epoch: 75
[ Wed Sep 28 06:46:26 2022 ] 	Mean test loss of 296 batches: 0.4778377721337853.
[ Wed Sep 28 06:46:26 2022 ] 	Top1: 84.59%
[ Wed Sep 28 06:46:26 2022 ] 	Top5: 98.35%
[ Wed Sep 28 06:46:26 2022 ] Training epoch: 76
[ Wed Sep 28 06:49:24 2022 ] 	Mean training loss: 0.5157. loss2: 0.0000. Mean training acc: 83.64%.
[ Wed Sep 28 06:49:24 2022 ] 	Time consumption: [Data]02%, [Network]98%
[ Wed Sep 28 06:49:24 2022 ] Eval epoch: 76
[ Wed Sep 28 06:49:58 2022 ] 	Mean test loss of 296 batches: 0.41583229339606054.
[ Wed Sep 28 06:49:58 2022 ] 	Top1: 86.64%
[ Wed Sep 28 06:49:58 2022 ] 	Top5: 98.51%
[ Wed Sep 28 06:49:58 2022 ] Training epoch: 77
[ Wed Sep 28 06:52:56 2022 ] 	Mean training loss: 0.5044. loss2: 0.0000. Mean training acc: 84.05%.
[ Wed Sep 28 06:52:56 2022 ] 	Time consumption: [Data]02%, [Network]98%
[ Wed Sep 28 06:52:56 2022 ] Eval epoch: 77
[ Wed Sep 28 06:53:30 2022 ] 	Mean test loss of 296 batches: 0.7462633654877946.
[ Wed Sep 28 06:53:30 2022 ] 	Top1: 77.31%
[ Wed Sep 28 06:53:30 2022 ] 	Top5: 96.12%
[ Wed Sep 28 06:53:30 2022 ] Training epoch: 78
[ Wed Sep 28 06:56:28 2022 ] 	Mean training loss: 0.5027. loss2: 0.0000. Mean training acc: 84.17%.
[ Wed Sep 28 06:56:28 2022 ] 	Time consumption: [Data]02%, [Network]98%
[ Wed Sep 28 06:56:28 2022 ] Eval epoch: 78
[ Wed Sep 28 06:57:02 2022 ] 	Mean test loss of 296 batches: 0.5369063204003347.
[ Wed Sep 28 06:57:02 2022 ] 	Top1: 83.10%
[ Wed Sep 28 06:57:02 2022 ] 	Top5: 97.80%
[ Wed Sep 28 06:57:02 2022 ] Training epoch: 79
[ Wed Sep 28 07:00:00 2022 ] 	Mean training loss: 0.5079. loss2: 0.0000. Mean training acc: 83.86%.
[ Wed Sep 28 07:00:00 2022 ] 	Time consumption: [Data]02%, [Network]98%
[ Wed Sep 28 07:00:00 2022 ] Eval epoch: 79
[ Wed Sep 28 07:00:34 2022 ] 	Mean test loss of 296 batches: 0.43087392219820536.
[ Wed Sep 28 07:00:34 2022 ] 	Top1: 86.56%
[ Wed Sep 28 07:00:34 2022 ] 	Top5: 98.22%
[ Wed Sep 28 07:00:35 2022 ] Training epoch: 80
[ Wed Sep 28 07:03:32 2022 ] 	Mean training loss: 0.5087. loss2: 0.0000. Mean training acc: 83.79%.
[ Wed Sep 28 07:03:32 2022 ] 	Time consumption: [Data]02%, [Network]98%
[ Wed Sep 28 07:03:32 2022 ] Eval epoch: 80
[ Wed Sep 28 07:04:06 2022 ] 	Mean test loss of 296 batches: 0.42286490052435044.
[ Wed Sep 28 07:04:06 2022 ] 	Top1: 86.77%
[ Wed Sep 28 07:04:06 2022 ] 	Top5: 98.61%
[ Wed Sep 28 07:04:06 2022 ] Training epoch: 81
[ Wed Sep 28 07:07:04 2022 ] 	Mean training loss: 0.5059. loss2: 0.0000. Mean training acc: 83.97%.
[ Wed Sep 28 07:07:04 2022 ] 	Time consumption: [Data]02%, [Network]98%
[ Wed Sep 28 07:07:05 2022 ] Eval epoch: 81
[ Wed Sep 28 07:07:38 2022 ] 	Mean test loss of 296 batches: 0.5119432708298838.
[ Wed Sep 28 07:07:38 2022 ] 	Top1: 84.00%
[ Wed Sep 28 07:07:38 2022 ] 	Top5: 98.21%
[ Wed Sep 28 07:07:38 2022 ] Training epoch: 82
[ Wed Sep 28 07:10:36 2022 ] 	Mean training loss: 0.5029. loss2: 0.0000. Mean training acc: 84.14%.
[ Wed Sep 28 07:10:36 2022 ] 	Time consumption: [Data]02%, [Network]98%
[ Wed Sep 28 07:10:36 2022 ] Eval epoch: 82
[ Wed Sep 28 07:11:10 2022 ] 	Mean test loss of 296 batches: 0.4967854123663258.
[ Wed Sep 28 07:11:10 2022 ] 	Top1: 83.95%
[ Wed Sep 28 07:11:10 2022 ] 	Top5: 98.16%
[ Wed Sep 28 07:11:10 2022 ] Training epoch: 83
[ Wed Sep 28 07:14:08 2022 ] 	Mean training loss: 0.5110. loss2: 0.0000. Mean training acc: 83.95%.
[ Wed Sep 28 07:14:08 2022 ] 	Time consumption: [Data]02%, [Network]98%
[ Wed Sep 28 07:14:08 2022 ] Eval epoch: 83
[ Wed Sep 28 07:14:42 2022 ] 	Mean test loss of 296 batches: 0.42879360697760777.
[ Wed Sep 28 07:14:42 2022 ] 	Top1: 86.08%
[ Wed Sep 28 07:14:42 2022 ] 	Top5: 98.63%
[ Wed Sep 28 07:14:42 2022 ] Training epoch: 84
[ Wed Sep 28 07:17:40 2022 ] 	Mean training loss: 0.5120. loss2: 0.0000. Mean training acc: 83.80%.
[ Wed Sep 28 07:17:40 2022 ] 	Time consumption: [Data]02%, [Network]98%
[ Wed Sep 28 07:17:40 2022 ] Eval epoch: 84
[ Wed Sep 28 07:18:13 2022 ] 	Mean test loss of 296 batches: 0.5979791719567131.
[ Wed Sep 28 07:18:14 2022 ] 	Top1: 81.41%
[ Wed Sep 28 07:18:14 2022 ] 	Top5: 97.86%
[ Wed Sep 28 07:18:14 2022 ] Training epoch: 85
[ Wed Sep 28 07:21:11 2022 ] 	Mean training loss: 0.5100. loss2: 0.0000. Mean training acc: 83.92%.
[ Wed Sep 28 07:21:11 2022 ] 	Time consumption: [Data]02%, [Network]98%
[ Wed Sep 28 07:21:12 2022 ] Eval epoch: 85
[ Wed Sep 28 07:21:45 2022 ] 	Mean test loss of 296 batches: 1.0820565333438885.
[ Wed Sep 28 07:21:45 2022 ] 	Top1: 69.31%
[ Wed Sep 28 07:21:45 2022 ] 	Top5: 92.85%
[ Wed Sep 28 07:21:45 2022 ] Training epoch: 86
[ Wed Sep 28 07:24:43 2022 ] 	Mean training loss: 0.5133. loss2: 0.0000. Mean training acc: 83.88%.
[ Wed Sep 28 07:24:43 2022 ] 	Time consumption: [Data]02%, [Network]98%
[ Wed Sep 28 07:24:43 2022 ] Eval epoch: 86
[ Wed Sep 28 07:25:17 2022 ] 	Mean test loss of 296 batches: 0.45851088385726957.
[ Wed Sep 28 07:25:17 2022 ] 	Top1: 85.36%
[ Wed Sep 28 07:25:17 2022 ] 	Top5: 98.30%
[ Wed Sep 28 07:25:17 2022 ] Training epoch: 87
[ Wed Sep 28 07:28:15 2022 ] 	Mean training loss: 0.5086. loss2: 0.0000. Mean training acc: 84.00%.
[ Wed Sep 28 07:28:15 2022 ] 	Time consumption: [Data]02%, [Network]98%
[ Wed Sep 28 07:28:15 2022 ] Eval epoch: 87
[ Wed Sep 28 07:28:49 2022 ] 	Mean test loss of 296 batches: 0.4870672011747956.
[ Wed Sep 28 07:28:49 2022 ] 	Top1: 84.48%
[ Wed Sep 28 07:28:49 2022 ] 	Top5: 98.10%
[ Wed Sep 28 07:28:49 2022 ] Training epoch: 88
[ Wed Sep 28 07:31:47 2022 ] 	Mean training loss: 0.4996. loss2: 0.0000. Mean training acc: 84.13%.
[ Wed Sep 28 07:31:47 2022 ] 	Time consumption: [Data]02%, [Network]98%
[ Wed Sep 28 07:31:48 2022 ] Eval epoch: 88
[ Wed Sep 28 07:32:21 2022 ] 	Mean test loss of 296 batches: 0.5611201071155232.
[ Wed Sep 28 07:32:21 2022 ] 	Top1: 82.58%
[ Wed Sep 28 07:32:21 2022 ] 	Top5: 97.89%
[ Wed Sep 28 07:32:21 2022 ] Training epoch: 89
[ Wed Sep 28 07:35:19 2022 ] 	Mean training loss: 0.5010. loss2: 0.0000. Mean training acc: 84.10%.
[ Wed Sep 28 07:35:19 2022 ] 	Time consumption: [Data]02%, [Network]98%
[ Wed Sep 28 07:35:19 2022 ] Eval epoch: 89
[ Wed Sep 28 07:35:53 2022 ] 	Mean test loss of 296 batches: 0.5508064797299134.
[ Wed Sep 28 07:35:53 2022 ] 	Top1: 82.14%
[ Wed Sep 28 07:35:53 2022 ] 	Top5: 98.21%
[ Wed Sep 28 07:35:53 2022 ] Training epoch: 90
[ Wed Sep 28 07:38:51 2022 ] 	Mean training loss: 0.5028. loss2: 0.0000. Mean training acc: 84.30%.
[ Wed Sep 28 07:38:51 2022 ] 	Time consumption: [Data]02%, [Network]98%
[ Wed Sep 28 07:38:51 2022 ] Eval epoch: 90
[ Wed Sep 28 07:39:25 2022 ] 	Mean test loss of 296 batches: 0.49458274990320206.
[ Wed Sep 28 07:39:25 2022 ] 	Top1: 84.04%
[ Wed Sep 28 07:39:25 2022 ] 	Top5: 98.11%
[ Wed Sep 28 07:39:25 2022 ] Training epoch: 91
[ Wed Sep 28 07:42:23 2022 ] 	Mean training loss: 0.3054. loss2: 0.0000. Mean training acc: 90.52%.
[ Wed Sep 28 07:42:23 2022 ] 	Time consumption: [Data]02%, [Network]98%
[ Wed Sep 28 07:42:23 2022 ] Eval epoch: 91
[ Wed Sep 28 07:42:57 2022 ] 	Mean test loss of 296 batches: 0.21424346116396623.
[ Wed Sep 28 07:42:57 2022 ] 	Top1: 93.04%
[ Wed Sep 28 07:42:57 2022 ] 	Top5: 99.29%
[ Wed Sep 28 07:42:57 2022 ] Training epoch: 92
[ Wed Sep 28 07:45:55 2022 ] 	Mean training loss: 0.2447. loss2: 0.0000. Mean training acc: 92.30%.
[ Wed Sep 28 07:45:55 2022 ] 	Time consumption: [Data]02%, [Network]98%
[ Wed Sep 28 07:45:55 2022 ] Eval epoch: 92
[ Wed Sep 28 07:46:28 2022 ] 	Mean test loss of 296 batches: 0.20008971959956595.
[ Wed Sep 28 07:46:28 2022 ] 	Top1: 93.62%
[ Wed Sep 28 07:46:29 2022 ] 	Top5: 99.35%
[ Wed Sep 28 07:46:29 2022 ] Training epoch: 93
[ Wed Sep 28 07:49:27 2022 ] 	Mean training loss: 0.2205. loss2: 0.0000. Mean training acc: 93.15%.
[ Wed Sep 28 07:49:27 2022 ] 	Time consumption: [Data]02%, [Network]98%
[ Wed Sep 28 07:49:27 2022 ] Eval epoch: 93
[ Wed Sep 28 07:50:00 2022 ] 	Mean test loss of 296 batches: 0.19434332177420524.
[ Wed Sep 28 07:50:01 2022 ] 	Top1: 93.75%
[ Wed Sep 28 07:50:01 2022 ] 	Top5: 99.34%
[ Wed Sep 28 07:50:01 2022 ] Training epoch: 94
[ Wed Sep 28 07:52:58 2022 ] 	Mean training loss: 0.2012. loss2: 0.0000. Mean training acc: 93.76%.
[ Wed Sep 28 07:52:58 2022 ] 	Time consumption: [Data]02%, [Network]98%
[ Wed Sep 28 07:52:59 2022 ] Eval epoch: 94
[ Wed Sep 28 07:53:32 2022 ] 	Mean test loss of 296 batches: 0.19127302370160013.
[ Wed Sep 28 07:53:32 2022 ] 	Top1: 93.89%
[ Wed Sep 28 07:53:32 2022 ] 	Top5: 99.38%
[ Wed Sep 28 07:53:33 2022 ] Training epoch: 95
[ Wed Sep 28 07:56:30 2022 ] 	Mean training loss: 0.1893. loss2: 0.0000. Mean training acc: 94.03%.
[ Wed Sep 28 07:56:30 2022 ] 	Time consumption: [Data]02%, [Network]98%
[ Wed Sep 28 07:56:30 2022 ] Eval epoch: 95
[ Wed Sep 28 07:57:04 2022 ] 	Mean test loss of 296 batches: 0.18424523921331037.
[ Wed Sep 28 07:57:04 2022 ] 	Top1: 93.99%
[ Wed Sep 28 07:57:04 2022 ] 	Top5: 99.40%
[ Wed Sep 28 07:57:04 2022 ] Training epoch: 96
[ Wed Sep 28 08:00:02 2022 ] 	Mean training loss: 0.1752. loss2: 0.0000. Mean training acc: 94.69%.
[ Wed Sep 28 08:00:02 2022 ] 	Time consumption: [Data]02%, [Network]98%
[ Wed Sep 28 08:00:02 2022 ] Eval epoch: 96
[ Wed Sep 28 08:00:36 2022 ] 	Mean test loss of 296 batches: 0.1803273815359618.
[ Wed Sep 28 08:00:36 2022 ] 	Top1: 94.22%
[ Wed Sep 28 08:00:36 2022 ] 	Top5: 99.34%
[ Wed Sep 28 08:00:36 2022 ] Training epoch: 97
[ Wed Sep 28 08:03:34 2022 ] 	Mean training loss: 0.1666. loss2: 0.0000. Mean training acc: 94.85%.
[ Wed Sep 28 08:03:34 2022 ] 	Time consumption: [Data]02%, [Network]98%
[ Wed Sep 28 08:03:35 2022 ] Eval epoch: 97
[ Wed Sep 28 08:04:08 2022 ] 	Mean test loss of 296 batches: 0.18944144595062007.
[ Wed Sep 28 08:04:08 2022 ] 	Top1: 94.00%
[ Wed Sep 28 08:04:08 2022 ] 	Top5: 99.29%
[ Wed Sep 28 08:04:08 2022 ] Training epoch: 98
[ Wed Sep 28 08:07:06 2022 ] 	Mean training loss: 0.1579. loss2: 0.0000. Mean training acc: 95.19%.
[ Wed Sep 28 08:07:06 2022 ] 	Time consumption: [Data]02%, [Network]98%
[ Wed Sep 28 08:07:06 2022 ] Eval epoch: 98
[ Wed Sep 28 08:07:40 2022 ] 	Mean test loss of 296 batches: 0.18534595115311644.
[ Wed Sep 28 08:07:40 2022 ] 	Top1: 94.13%
[ Wed Sep 28 08:07:40 2022 ] 	Top5: 99.32%
[ Wed Sep 28 08:07:40 2022 ] Training epoch: 99
[ Wed Sep 28 08:10:39 2022 ] 	Mean training loss: 0.1508. loss2: 0.0000. Mean training acc: 95.36%.
[ Wed Sep 28 08:10:39 2022 ] 	Time consumption: [Data]02%, [Network]98%
[ Wed Sep 28 08:10:39 2022 ] Eval epoch: 99
[ Wed Sep 28 08:11:13 2022 ] 	Mean test loss of 296 batches: 0.18752058254982773.
[ Wed Sep 28 08:11:13 2022 ] 	Top1: 94.23%
[ Wed Sep 28 08:11:13 2022 ] 	Top5: 99.36%
[ Wed Sep 28 08:11:13 2022 ] Training epoch: 100
[ Wed Sep 28 08:14:11 2022 ] 	Mean training loss: 0.1423. loss2: 0.0000. Mean training acc: 95.69%.
[ Wed Sep 28 08:14:11 2022 ] 	Time consumption: [Data]02%, [Network]98%
[ Wed Sep 28 08:14:11 2022 ] Eval epoch: 100
[ Wed Sep 28 08:14:45 2022 ] 	Mean test loss of 296 batches: 0.2071267983599289.
[ Wed Sep 28 08:14:45 2022 ] 	Top1: 93.42%
[ Wed Sep 28 08:14:45 2022 ] 	Top5: 99.29%
[ Wed Sep 28 08:14:45 2022 ] Training epoch: 101
[ Wed Sep 28 08:17:43 2022 ] 	Mean training loss: 0.1183. loss2: 0.0000. Mean training acc: 96.64%.
[ Wed Sep 28 08:17:43 2022 ] 	Time consumption: [Data]02%, [Network]98%
[ Wed Sep 28 08:17:43 2022 ] Eval epoch: 101
[ Wed Sep 28 08:18:16 2022 ] 	Mean test loss of 296 batches: 0.17530329461284988.
[ Wed Sep 28 08:18:16 2022 ] 	Top1: 94.52%
[ Wed Sep 28 08:18:17 2022 ] 	Top5: 99.36%
[ Wed Sep 28 08:18:17 2022 ] Training epoch: 102
[ Wed Sep 28 08:21:15 2022 ] 	Mean training loss: 0.1081. loss2: 0.0000. Mean training acc: 96.98%.
[ Wed Sep 28 08:21:15 2022 ] 	Time consumption: [Data]02%, [Network]98%
[ Wed Sep 28 08:21:15 2022 ] Eval epoch: 102
[ Wed Sep 28 08:21:49 2022 ] 	Mean test loss of 296 batches: 0.17568889878237168.
[ Wed Sep 28 08:21:49 2022 ] 	Top1: 94.55%
[ Wed Sep 28 08:21:49 2022 ] 	Top5: 99.36%
[ Wed Sep 28 08:21:49 2022 ] Training epoch: 103
[ Wed Sep 28 08:24:47 2022 ] 	Mean training loss: 0.1028. loss2: 0.0000. Mean training acc: 97.22%.
[ Wed Sep 28 08:24:47 2022 ] 	Time consumption: [Data]02%, [Network]98%
[ Wed Sep 28 08:24:47 2022 ] Eval epoch: 103
[ Wed Sep 28 08:25:21 2022 ] 	Mean test loss of 296 batches: 0.17425445092816813.
[ Wed Sep 28 08:25:21 2022 ] 	Top1: 94.61%
[ Wed Sep 28 08:25:21 2022 ] 	Top5: 99.36%
[ Wed Sep 28 08:25:21 2022 ] Training epoch: 104
[ Wed Sep 28 08:28:19 2022 ] 	Mean training loss: 0.0969. loss2: 0.0000. Mean training acc: 97.38%.
[ Wed Sep 28 08:28:19 2022 ] 	Time consumption: [Data]02%, [Network]98%
[ Wed Sep 28 08:28:19 2022 ] Eval epoch: 104
[ Wed Sep 28 08:28:52 2022 ] 	Mean test loss of 296 batches: 0.1741077038774116.
[ Wed Sep 28 08:28:53 2022 ] 	Top1: 94.59%
[ Wed Sep 28 08:28:53 2022 ] 	Top5: 99.37%
[ Wed Sep 28 08:28:53 2022 ] Training epoch: 105
[ Wed Sep 28 08:31:50 2022 ] 	Mean training loss: 0.0956. loss2: 0.0000. Mean training acc: 97.55%.
[ Wed Sep 28 08:31:50 2022 ] 	Time consumption: [Data]02%, [Network]98%
[ Wed Sep 28 08:31:50 2022 ] Eval epoch: 105
[ Wed Sep 28 08:32:24 2022 ] 	Mean test loss of 296 batches: 0.17691492876415518.
[ Wed Sep 28 08:32:24 2022 ] 	Top1: 94.53%
[ Wed Sep 28 08:32:24 2022 ] 	Top5: 99.33%
[ Wed Sep 28 08:32:24 2022 ] Training epoch: 106
[ Wed Sep 28 08:35:23 2022 ] 	Mean training loss: 0.0925. loss2: 0.0000. Mean training acc: 97.57%.
[ Wed Sep 28 08:35:23 2022 ] 	Time consumption: [Data]02%, [Network]98%
[ Wed Sep 28 08:35:23 2022 ] Eval epoch: 106
[ Wed Sep 28 08:35:56 2022 ] 	Mean test loss of 296 batches: 0.17666297893014712.
[ Wed Sep 28 08:35:56 2022 ] 	Top1: 94.41%
[ Wed Sep 28 08:35:56 2022 ] 	Top5: 99.32%
[ Wed Sep 28 08:35:56 2022 ] Training epoch: 107
[ Wed Sep 28 08:38:54 2022 ] 	Mean training loss: 0.0890. loss2: 0.0000. Mean training acc: 97.71%.
[ Wed Sep 28 08:38:54 2022 ] 	Time consumption: [Data]02%, [Network]98%
[ Wed Sep 28 08:38:54 2022 ] Eval epoch: 107
[ Wed Sep 28 08:39:28 2022 ] 	Mean test loss of 296 batches: 0.17566952670018213.
[ Wed Sep 28 08:39:28 2022 ] 	Top1: 94.60%
[ Wed Sep 28 08:39:28 2022 ] 	Top5: 99.34%
[ Wed Sep 28 08:39:28 2022 ] Training epoch: 108
[ Wed Sep 28 08:42:26 2022 ] 	Mean training loss: 0.0889. loss2: 0.0000. Mean training acc: 97.61%.
[ Wed Sep 28 08:42:26 2022 ] 	Time consumption: [Data]02%, [Network]98%
[ Wed Sep 28 08:42:26 2022 ] Eval epoch: 108
[ Wed Sep 28 08:43:00 2022 ] 	Mean test loss of 296 batches: 0.17621354759368743.
[ Wed Sep 28 08:43:00 2022 ] 	Top1: 94.60%
[ Wed Sep 28 08:43:00 2022 ] 	Top5: 99.32%
[ Wed Sep 28 08:43:00 2022 ] Training epoch: 109
[ Wed Sep 28 08:45:58 2022 ] 	Mean training loss: 0.0881. loss2: 0.0000. Mean training acc: 97.64%.
[ Wed Sep 28 08:45:58 2022 ] 	Time consumption: [Data]02%, [Network]98%
[ Wed Sep 28 08:45:58 2022 ] Eval epoch: 109
[ Wed Sep 28 08:46:31 2022 ] 	Mean test loss of 296 batches: 0.17804291179227466.
[ Wed Sep 28 08:46:32 2022 ] 	Top1: 94.49%
[ Wed Sep 28 08:46:32 2022 ] 	Top5: 99.31%
[ Wed Sep 28 08:46:32 2022 ] Training epoch: 110
[ Wed Sep 28 08:49:30 2022 ] 	Mean training loss: 0.0850. loss2: 0.0000. Mean training acc: 97.77%.
[ Wed Sep 28 08:49:30 2022 ] 	Time consumption: [Data]02%, [Network]98%
[ Wed Sep 28 08:49:30 2022 ] Eval epoch: 110
[ Wed Sep 28 08:50:03 2022 ] 	Mean test loss of 296 batches: 0.17942565672942815.
[ Wed Sep 28 08:50:03 2022 ] 	Top1: 94.38%
[ Wed Sep 28 08:50:03 2022 ] 	Top5: 99.33%
[ Wed Sep 28 08:50:37 2022 ] Best accuracy: 0.946122966406085
[ Wed Sep 28 08:50:37 2022 ] Epoch number: 103
[ Wed Sep 28 08:50:37 2022 ] Model name: work_dir/ntu60/cview/fc_joint
[ Wed Sep 28 08:50:37 2022 ] Model total number of params: 2082097
[ Wed Sep 28 08:50:37 2022 ] Weight decay: 0.0004
[ Wed Sep 28 08:50:37 2022 ] Base LR: 0.1
[ Wed Sep 28 08:50:37 2022 ] Batch Size: 64
[ Wed Sep 28 08:50:37 2022 ] Test Batch Size: 64
[ Wed Sep 28 08:50:37 2022 ] seed: 1
